Noisy Matrix Completion Using Alternating Minimization
Authors
Abstract
The task of matrix completion involves estimating the entries of a matrix M ∈ R^{m×n} when only a subset Ω ⊂ {(i, j) : 1 ≤ i ≤ m, 1 ≤ j ≤ n} of the entries is observed. One family of low-rank models for this task approximates the matrix as a product of two low-rank factors, M̂ = UV^T, where U ∈ R^{m×k}, V ∈ R^{n×k}, and k ≪ min{m, n}. A popular algorithm in practice for recovering M from the partially observed matrix under the low-rank assumption is alternating least squares (ALS) minimization, which optimizes over U and V in an alternating manner, minimizing the squared error over the observed entries with respect to one factor while keeping the other fixed. Despite its wide use in practice, theoretical guarantees bounding the error between the matrix estimated by ALS and the original matrix M were established only recently. In this work we extend these results from the noiseless setting and provide the first recovery guarantees under noise for alternating minimization. Specifically, we show that for well-conditioned matrices corrupted by random noise of bounded Frobenius norm, if the number of observed entries is O(kn log n), then the ALS algorithm recovers the original matrix within an error bound that depends on the norm of the noise matrix. The sample complexity matches that derived in [7] for noise-free matrix completion using ALS.
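The alternating scheme described above can be sketched in a few lines of NumPy: with one factor fixed, each row of the other factor is the solution of a small independent least-squares problem over that row's observed entries. This is an illustrative sketch, not the paper's exact procedure; the function name, the random initialization, and the small ridge regularizer (added only for numerical stability) are all assumptions of this example.

```python
import numpy as np

def als_complete(M_obs, mask, k, n_iters=50, reg=1e-6):
    """Alternating least squares for matrix completion (illustrative sketch).

    M_obs : (m, n) matrix; unobserved entries may hold arbitrary values
    mask  : (m, n) boolean matrix, True where an entry is observed
    k     : target rank
    """
    m, n = M_obs.shape
    rng = np.random.default_rng(0)
    U = rng.standard_normal((m, k))
    V = rng.standard_normal((n, k))
    I = np.eye(k)
    for _ in range(n_iters):
        # Fix V: each row U[i] solves a k-dimensional least-squares problem
        # over the columns observed in row i.
        for i in range(m):
            obs = mask[i]
            Vo = V[obs]  # factor rows of the observed columns
            U[i] = np.linalg.solve(Vo.T @ Vo + reg * I, Vo.T @ M_obs[i, obs])
        # Fix U: solve symmetrically for each row V[j] (column j of M).
        for j in range(n):
            obs = mask[:, j]
            Uo = U[obs]
            V[j] = np.linalg.solve(Uo.T @ Uo + reg * I, Uo.T @ M_obs[obs, j])
    return U, V
```

Each half-step is a convex subproblem solved exactly, which is what makes the per-iteration behavior of ALS tractable to analyze.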
Similar resources
Provable Matrix Sensing using Alternating Minimization
Alternating minimization has emerged as a popular heuristic for large-scale machine learning problems involving low-rank matrices. However, there have been few (if any) theoretical guarantees on its performance. In this work, we investigate the natural alternating minimization algorithm for the popular matrix sensing problem first formulated in [RFP07]; this problem asks for the recovery of an ...
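In matrix sensing, the observations are general linear measurements b_i = ⟨A_i, M⟩ rather than individual entries, and the same alternating idea applies: with one factor fixed, the measurements are linear in the other factor. The NumPy sketch below illustrates this under assumed Gaussian sensing matrices; it uses a random initialization for brevity, whereas the analyzed algorithms typically initialize from a spectral (SVD-based) step.

```python
import numpy as np

def als_sense(A, b, m, n, k, n_iters=30):
    """Alternating minimization for matrix sensing (illustrative sketch).

    Recovers a rank-k M in R^{m x n} from measurements b_i = <A_i, M>.
    A : (p, m, n) array of sensing matrices A_i
    b : (p,) array of measurements
    """
    rng = np.random.default_rng(0)
    V = np.linalg.qr(rng.standard_normal((n, k)))[0]
    for _ in range(n_iters):
        # Fix V: <A_i, U V^T> = <vec(A_i V), vec(U)>, so solving for U
        # is an ordinary least-squares problem in the mk entries of U.
        G = (A @ V).reshape(len(b), -1)  # row i is vec(A_i V)
        U = np.linalg.lstsq(G, b, rcond=None)[0].reshape(m, k)
        # Fix U: symmetrically, row i is vec(A_i^T U); solve for V.
        H = (A.transpose(0, 2, 1) @ U).reshape(len(b), -1)
        V = np.linalg.lstsq(H, b, rcond=None)[0].reshape(n, k)
    return U, V
```

The key identity is ⟨A_i, UV^T⟩ = Σ_{j,c} U[j,c](A_iV)[j,c], which turns each half-step into an overdetermined linear regression when the number of measurements exceeds mk (respectively nk).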
Recovery guarantee of weighted low-rank approximation via alternating minimization
Many applications require recovering a ground truth low-rank matrix from noisy observations of the entries. In practice, this is typically formulated as a weighted low-rank approximation problem and solved using non-convex optimization heuristics such as alternating minimization. Such non-convex techniques have few guarantees. Even worse, weighted low-rank approximation is NP-hard for even the ...
Provable Inductive Matrix Completion
Consider a movie recommendation system where apart from the ratings information, side information such as user’s age or movie’s genre is also available. Unlike standard matrix completion, in this setting one should be able to predict inductively on new users/movies. In this paper, we study the problem of inductive matrix completion in the exact recovery setting. That is, we assume that the rati...
On the Provable Convergence of Alternating Minimization for Matrix Completion
Alternating Minimization is a widely used and empirically successful framework for Matrix Completion and related low-rank optimization problems. We give a new algorithm based on Alternating Minimization that provably recovers an unknown low-rank matrix from a random subsample of its entries under a standard incoherence assumption while achieving a linear convergence rate. Compared to previous w...
Near-optimal sample complexity for convex tensor completion
We analyze low rank tensor completion (TC) using noisy measurements of a subset of the tensor. Assuming a rank-$r$, order-$d$, $N \times N \times \cdots \times N$ tensor where $r=O(1)$, the best sampling complexity that was achieved is $O(N^{\frac{d}{2}})$, which is obtained by solving a tensor nuclear-norm minimization problem. However, this bound is significantly larger than the number of fre...
Publication date: 2013